On Bagging and Estimation in Multivariate Mixtures

Author

  • Reza Pakyari
Abstract

Two bagging approaches, namely n/2-out-of-n sampling without replacement (subagging) and n-out-of-n sampling with replacement (bagging), have been applied to the problem of estimating the parameters of a multivariate mixture model. Monte Carlo simulations and a real-data example show that both bagging methods reduce the standard deviation of the maximum likelihood estimator of the mixing proportion, while the absolute bias increases slightly. In estimating the component distributions, bagging could increase the root mean integrated squared error when estimating the most probable component.
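Since the abstract contrasts two resampling schemes, the hedged sketch below illustrates what such a comparison might look like in code. It uses scikit-learn's GaussianMixture as a stand-in maximum-likelihood fitter for a two-component multivariate normal mixture and averages the estimated mixing proportion over B resamples, drawn either as half-samples without replacement (subagging) or as full-size samples with replacement (bagging). The toy data, the choice of GaussianMixture, the number of replicates, and the use of the smaller weight to sidestep label switching are illustrative assumptions, not details taken from the paper.

# Minimal sketch (assumptions noted above): subagging vs. bagging for the
# mixing-proportion MLE in a two-component multivariate normal mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

def mixing_proportion_mle(x):
    # Fit a two-component Gaussian mixture by maximum likelihood and report
    # the smaller mixing weight (a crude guard against label switching).
    gm = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(x)
    return gm.weights_.min()

def bagged_proportion(x, B=100, subagging=False, seed=0):
    # Average the MLE over B resamples: n/2-out-of-n without replacement
    # (subagging) or n-out-of-n with replacement (bagging).
    rng = np.random.default_rng(seed)
    n = len(x)
    m = n // 2 if subagging else n
    estimates = []
    for _ in range(B):
        idx = rng.choice(n, size=m, replace=not subagging)
        estimates.append(mixing_proportion_mle(x[idx]))
    return float(np.mean(estimates))

# Toy data: a 70/30 mixture of two bivariate normals (purely illustrative).
rng = np.random.default_rng(1)
n = 300
labels = rng.random(n) < 0.3
x = np.where(labels[:, None],
             rng.normal(loc=3.0, size=(n, 2)),
             rng.normal(loc=0.0, size=(n, 2)))

print("plain MLE :", mixing_proportion_mle(x))
print("bagging   :", bagged_proportion(x, subagging=False))
print("subagging :", bagged_proportion(x, subagging=True))

In this sketch the bagged estimate is simply the mean of the resampled MLEs; comparing its spread to that of the plain MLE over repeated simulated datasets would mimic the kind of standard-deviation comparison the abstract reports.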


Related Articles

Predicting Flow Number of Asphalt Mixtures Based on the Marshall Mix Design Parameters Using Multivariate Adaptive Regression Spline (MARS)

Rutting is one of the major distresses in flexible pavements and is heavily influenced by the properties of asphalt mixtures at high temperatures. There are several methods for characterizing the rutting resistance of asphalt mixtures. Flow number is one of the most important parameters that can be used for the evaluation of rutting. The flow number is measured by the dynamic creep...

Full text

Bagging Binary and Quantile Predictors for Time Series: Further Issues

Bagging (bootstrap aggregating) is a smoothing method to improve predictive ability in the presence of parameter estimation uncertainty and model uncertainty. In Lee and Yang (2006), we examined how (equal-weighted and BMA-weighted) bagging works for one-step-ahead binary prediction with an asymmetric cost function for time series, where we considered simple cases with particular choices of a...

Full text

Combining Classifiers based on Gaussian Mixtures

A combination of classification rules (classifiers) is known as an ensemble, and in general it is more accurate than the individual classifiers used to build it. Two popular methods to construct an ensemble are Bagging (bootstrap aggregating), introduced by Breiman [4], and Boosting (Freund and Schapire [11]). Both methods rely on resampling techniques to obtain different training sets for each...

Full text

Efficiently Approximating Markov Tree Bagging for High-Dimensional Density Estimation

We consider algorithms for generating Mixtures of Bagged Markov Trees for density estimation. In problems defined over many variables and when few observations are available, those mixtures generally outperform a single Markov tree maximizing the data likelihood, but are far more expensive to compute. In this paper, we describe new algorithms for approximating such models, with the aim of spee...

Full text

Adaptive Bayesian multivariate density estimation with Dirichlet mixtures

We show that rate-adaptive multivariate density estimation can be performed using Bayesian methods based on Dirichlet mixtures of normal kernels with a prior distribution on the kernel’s covariance matrix parameter. We derive sufficient conditions on the prior specification that guarantee convergence to a true density at a rate that is minimax optimal for the smoothness class to which the true ...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2008